A Note on Metric Properties for Some Divergence Measures: The Gaussian Case

Authors

  • Karim T. Abou-Moustafa
  • Frank P. Ferrie
Abstract

Multivariate Gaussian densities are pervasive in pattern recognition and machine learning. A central operation that appears in most of these areas is to measure the difference between two multivariate Gaussians. Unfortunately, traditional measures based on the Kullback–Leibler (KL) divergence and the Bhattacharyya distance do not satisfy all metric axioms necessary for many algorithms. In this paper we propose a modification for the KL divergence and the Bhattacharyya distance, for multivariate Gaussian densities, that transforms the two measures into distance metrics. Next, we show how these metric axioms impact the unfolding process of manifold learning algorithms. Finally, we illustrate the efficacy of the proposed metrics on two different manifold learning algorithms when used for motion clustering in video data. Our results show that, in this particular application, the new proposed metrics lead to boosts in performance (at least 7%) when compared to other divergence measures.
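For context, both quantities discussed in the abstract have closed-form expressions for multivariate Gaussians, which makes the metric failure easy to demonstrate. The sketch below is a minimal NumPy illustration with hypothetical variable names; it computes the standard closed-form KL divergence and Bhattacharyya distance between two Gaussians and shows that KL is asymmetric (violating the symmetry axiom of a metric). It does not reproduce the modified metrics proposed in the paper.

    # Minimal sketch (not the authors' proposed metrics): closed-form KL
    # divergence and Bhattacharyya distance between multivariate Gaussians.
    import numpy as np

    def kl_gaussian(mu0, Sigma0, mu1, Sigma1):
        """KL( N(mu0, Sigma0) || N(mu1, Sigma1) ) in closed form."""
        d = mu0.shape[0]
        inv1 = np.linalg.inv(Sigma1)
        diff = mu1 - mu0
        return 0.5 * (np.trace(inv1 @ Sigma0)
                      + diff @ inv1 @ diff
                      - d
                      + np.log(np.linalg.det(Sigma1) / np.linalg.det(Sigma0)))

    def bhattacharyya_gaussian(mu0, Sigma0, mu1, Sigma1):
        """Bhattacharyya distance between N(mu0, Sigma0) and N(mu1, Sigma1)."""
        Sigma = 0.5 * (Sigma0 + Sigma1)       # average covariance
        diff = mu1 - mu0
        term1 = 0.125 * diff @ np.linalg.inv(Sigma) @ diff
        term2 = 0.5 * np.log(np.linalg.det(Sigma)
                             / np.sqrt(np.linalg.det(Sigma0) * np.linalg.det(Sigma1)))
        return term1 + term2

    # Example with illustrative parameters: KL(p||q) != KL(q||p),
    # so the KL divergence is not symmetric and hence not a metric.
    mu0, Sigma0 = np.zeros(2), np.eye(2)
    mu1, Sigma1 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
    print(kl_gaussian(mu0, Sigma0, mu1, Sigma1))
    print(kl_gaussian(mu1, Sigma1, mu0, Sigma0))
    print(bhattacharyya_gaussian(mu0, Sigma0, mu1, Sigma1))

The Bhattacharyya distance is symmetric, but it is known not to satisfy the triangle inequality in general, which is the other axiom the paper's modification addresses.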


Similar articles

A note on decision making in medical investigations using new divergence measures for intuitionistic fuzzy sets

Srivastava and Maheshwari (Iranian Journal of Fuzzy Systems 13(1) (2016) 25-44) introduced a new divergence measure for intuitionistic fuzzy sets (IFSs). The properties of the proposed divergence measure were studied and the efficiency of the proposed divergence measure in the context of medical diagnosis was also demonstrated. In this note, we point out some errors in ...


Information Measures via Copula Functions

In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among these, in this paper we examine measures such as the Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to the distance between probability d...


Decision making in medical investigations using new divergence measures for intuitionistic fuzzy sets

In recent times, intuitionistic fuzzy sets, introduced by Atanassov, have become one of the most powerful and flexible approaches for dealing with complex and uncertain real-world situations. In particular, the concept of divergence between intuitionistic fuzzy sets is important since it has applications in various areas such as image segmentation, decision making, medical diagnosis, pattern reco...


A Time-Domain Method for Shape Reconstruction of a Target with Known Electrical Properties (RESEARCH NOTE)

This paper uses a method for shape reconstruction of a 2-D homogeneous object with arbitrary geometry and known electrical properties. In this method, the object is illuminated by a Gaussian pulse modulated with a sinusoidal carrier plane wave, and the time-domain footprint signal due to the object's presence is used for the shape reconstruction. A nonlinear feedback loop is used to minimize the diff...


NEW RESULTS ON THE EXISTING FUZZY DISTANCE MEASURES

In this paper, we investigate the properties of some recently proposed fuzzy distance measures. We identify some shortcomings of these distances, and the results obtained are then illustrated by solving several examples and compared with the other fuzzy distances.




Publication date: 2012